feat(cheats): Cloudflare Worker middleware proxy for GameHacking.org cheat lookup #3518

github-actions[bot] wants to merge 7 commits into develop from
Conversation
🤖 PR created. AI review starting automatically.
Pull request overview
Adds an optional Cloudflare Workers “cheat proxy” layer for GameHacking.org lookups, plus app-side settings and lookup logic to prefer the proxy (with fallback to the existing in-app HTML scraper). This fits into the existing PVLibrary cheat lookup pipeline by making the scraper more resilient and cacheable without removing the current fallback path.
Changes:
- Added a Cloudflare Worker (`Scripts/cheat-proxy/worker.js`) plus Wrangler config and deployment docs to scrape and return normalized JSON with KV caching.
- Added new PVSettings keys (`useCheatProxy`, `cheatProxyURL`) to toggle and configure proxy usage.
- Updated `GameHackingOrgLookup` to try the proxy first (when enabled/configured) and fall back to direct HTML scraping.
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| Scripts/cheat-proxy/wrangler.toml | Worker deployment configuration (KV binding + optional vars). |
| Scripts/cheat-proxy/worker.js | Implements /cheats and /health, scraping + KV caching + JSON response. |
| Scripts/cheat-proxy/README.md | Manual deployment and local dev instructions for the Worker. |
| PVSettings/Sources/PVSettings/Settings/Model/PVSettingsModel.swift | Adds Defaults keys for enabling and configuring the proxy. |
| PVLibrary/Sources/PVLibrary/Cheat/GameHackingOrgLookup.swift | Adds proxy-first lookup path with fallback to existing scraper. |
| .changelog/3518.md | Changelog fragment describing the new proxy and settings. |
> - **GameHacking.org Cheat Proxy** — Cloudflare Workers middleware (`Scripts/cheat-proxy/`) that scrapes GameHacking.org and returns normalised JSON cheat entries with 24h server-side KV caching; reduces scraper fragility and load on the source site.
> - **Cheat proxy settings** — `useCheatProxy` (default `true`) and `cheatProxyURL` settings allow the proxy endpoint to be enabled/disabled and configured without an app update.
> - **Automatic proxy fallback** — When the proxy is unreachable or returns no results, `GameHackingOrgLookup` transparently falls back to the existing direct HTML scraper.

Changelog fragments should be plain English with no trailing period (see .changelog/README.md:38-42). Please remove trailing periods from these bullets so the consolidated CHANGELOG.md matches the repo's style.

Suggested change:

- **GameHacking.org Cheat Proxy** — Cloudflare Workers middleware (`Scripts/cheat-proxy/`) that scrapes GameHacking.org and returns normalised JSON cheat entries with 24h server-side KV caching; reduces scraper fragility and load on the source site
- **Cheat proxy settings** — `useCheatProxy` (default `true`) and `cheatProxyURL` settings allow the proxy endpoint to be enabled/disabled and configured without an app update
- **Automatic proxy fallback** — When the proxy is unreachable or returns no results, `GameHackingOrgLookup` transparently falls back to the existing direct HTML scraper
```js
headers: {
    "Content-Type": "application/json",
    "Access-Control-Allow-Origin": "*",
    "Cache-Control": `public, max-age=${CACHE_TTL_SECONDS}`,
    ...extraHeaders,
```
jsonResponse hardcodes Access-Control-Allow-Origin: *, but wrangler.toml suggests an ALLOWED_ORIGINS var to restrict CORS. Either implement allow-listing here (and ideally handle OPTIONS preflight) using env.ALLOWED_ORIGINS, or remove the config/docs so deployments don't assume they’re restricting origins when they aren’t.
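For illustration, allow-listing plus preflight handling could look like the sketch below. The comma-separated `ALLOWED_ORIGINS` format and both helper bodies are assumptions, not the shipped worker code; only the `corsHeaders(origin, env)` name mirrors what the later snippets in this review call.

```javascript
// Sketch: origin allow-listing driven by an assumed comma-separated
// ALLOWED_ORIGINS env var ("*" or absent means allow everything).
function corsHeaders(origin, env) {
  const allowed = (env.ALLOWED_ORIGINS || "*")
    .split(",")
    .map((o) => o.trim())
    .filter(Boolean);
  if (allowed.includes("*")) {
    return { "Access-Control-Allow-Origin": "*" };
  }
  if (origin && allowed.includes(origin)) {
    // Echo the specific origin and tell caches the response varies by Origin.
    return { "Access-Control-Allow-Origin": origin, "Vary": "Origin" };
  }
  return {}; // disallowed origin: omit CORS headers so the browser blocks the read
}

// A matching OPTIONS preflight handler would short-circuit before routing.
function handleOptions(origin, env) {
  return new Response(null, {
    status: 204,
    headers: {
      ...corsHeaders(origin, env),
      "Access-Control-Allow-Methods": "GET, OPTIONS",
      "Access-Control-Allow-Headers": "Content-Type",
    },
  });
}
```

Returning no CORS headers at all for disallowed origins (rather than an error) keeps the response usable by same-origin and non-browser callers.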
```js
try {
    results = await fetchCheats(title, system || null);
} catch (err) {
    console.error("Cheat fetch error:", err);
    // Return empty array rather than error — caller falls back to direct scraping
    results = [];
}

// Store in KV with TTL
if (env[KV_NAMESPACE]) {
```
KV writes happen unconditionally for the computed results. Because upstream fetch/parsing failures are converted into an empty array, this can cache transient errors as “no cheats” for 24h (false negatives). Consider only writing to KV when the upstream fetch succeeded, or using a shorter TTL / error marker for failure cases.
Suggested change:

```js
let fetchSucceeded = false;
try {
    results = await fetchCheats(title, system || null);
    fetchSucceeded = true;
} catch (err) {
    console.error("Cheat fetch error:", err);
    // Return empty array rather than error — caller falls back to direct scraping
    results = [];
}

// Store in KV with TTL only when upstream fetch succeeded
if (env[KV_NAMESPACE] && fetchSucceeded) {
```
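The comment above also floats a shorter TTL / error marker as an alternative; a minimal sketch of that approach follows. The TTL values and the cached-value shape are illustrative assumptions, not the shipped worker code.

```javascript
// Sketch: cache failures too, but with a short TTL and an explicit marker, so a
// transient error is retried within minutes instead of being served as
// "no cheats" for 24 hours.
const OK_TTL_SECONDS = 24 * 60 * 60; // successful lookups: 24h, matching the PR
const ERROR_TTL_SECONDS = 5 * 60;    // transient failures: retry after 5 minutes

function cacheEntry(results, fetchSucceeded) {
  return {
    // `ok: false` lets the read path distinguish a cached failure
    // from a genuine empty result.
    value: JSON.stringify({ ok: fetchSucceeded, results }),
    ttl: fetchSucceeded ? OK_TTL_SECONDS : ERROR_TTL_SECONDS,
  };
}
```

The KV write would then pass the chosen TTL via `expirationTtl`, e.g. `env[KV_NAMESPACE].put(cacheKey, entry.value, { expirationTtl: entry.ttl })`.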
```js
function makeCacheKey(title, system) {
    const t = title.toLowerCase().trim();
    const s = (system || "any").toLowerCase().trim();
    return `ghorg::${t}::${s}`;
}
```
makeCacheKey embeds raw user input into the KV key. Very long (or unusual) titles can exceed Cloudflare KV key limits and cause get/put to throw, and it also enables easy cache-busting/write amplification. Consider bounding title/system length and/or hashing (e.g., SHA-256) the normalized inputs for the KV key, and wrapping KV operations in try/catch to avoid request failures.
```swift
static let useCheatProxy = Key<Bool>("useCheatProxy", default: true)

/// Base URL of the deployed Provenance cheat proxy worker.
/// Set to empty string to disable the proxy and always use direct scraping.
```
The doc comment says setting cheatProxyURL to an empty string disables the proxy, but GameHackingOrgLookup.resolvedProxyURL() falls back to defaultProxyURL when the stored value is empty. If a non-empty compile-time default is set, users won’t be able to “disable via empty string” (they’d need useCheatProxy = false). Please update the comment (or adjust the resolution logic) so behavior and documentation match.
Suggested change:

```swift
/// Base URL of the deployed Provenance cheat proxy worker.
///
/// Behavior:
/// - If `useCheatProxy` is `false`, the proxy is not used, regardless of this value.
/// - If `useCheatProxy` is `true` and this is an empty string, a built-in default
///   proxy URL (if configured) will be used.
/// - If `useCheatProxy` is `true` and this is non-empty, that value is used as the
///   proxy base URL.
///
```
```swift
let proxyURL = resolvedProxyURL()
if !proxyURL.isEmpty, Defaults[.useCheatProxy] {
    let proxyResults = await fetchFromProxy(title: title, systemSlug: systemSlug, proxyBaseURL: proxyURL)
    if !proxyResults.isEmpty {
        DLOG("GameHackingOrgLookup: proxy returned \(proxyResults.count) codes for '\(title)'")
```
New proxy-first behavior (URL construction + JSON decoding + fallback when proxy is unreachable/empty) isn’t covered by unit tests. Since PVLibrary already has GameHackingOrgLookupTests, consider adding tests that stub URLSession (via URLProtocol) to validate these proxy scenarios without network access.
Force-pushed c8f57a3 to c989d7e.
@claude please audit all Copilot review comments on this PR. For each comment:
After addressing the review comments, do a self-review pass:
Then re-request Copilot review.

READY FOR REVIEW: AI approved, auto-merge enabled, @JoeMatt assigned for final review.
✅ Fixes pushed — addressed all 6 Copilot review comments. Fixed (valid issues):
```js
const title = url.searchParams.get("title");
const system = url.searchParams.get("system") || "";

if (!title || title.trim() === "") {
    return new Response(JSON.stringify({ error: "Missing required parameter: title" }), {
        status: 400,
        headers: { "Content-Type": "application/json", ...corsHeaders(origin, env) },
    });
}

const cacheKey = makeCacheKey(title, system);
```
The request parameters are only validated for non-empty title, but MAX_TITLE_LENGTH / MAX_SYSTEM_LENGTH are only applied when building the KV key. That means very long inputs can still be forwarded upstream (wasted CPU/bandwidth) and truncation can cause KV key collisions where distinct long titles share the same cache entry (returning incorrect cheats). Consider enforcing max lengths at the request boundary (return 400) and/or switching the KV key to a hash of the full normalized inputs (optionally keeping a short prefix for readability).
```swift
UserDefaults.standard.set(true, forKey: "useCheatProxy")
UserDefaults.standard.set("https://test.proxy.pvemu.invalid", forKey: "cheatProxyURL")
defer {
    UserDefaults.standard.set(false, forKey: "useCheatProxy")
    UserDefaults.standard.set("", forKey: "cheatProxyURL")
}
```
These tests mutate global defaults but the defer block sets useCheatProxy to false instead of restoring the original/default value (default is true). This can leak state into later tests and change their behavior. Prefer saving the prior values and restoring them, or remove the keys in defer (e.g., removeObject(forKey:) / Defaults.reset(...)) and consider using the typed Defaults[.useCheatProxy] / Defaults[.cheatProxyURL] APIs instead of raw string keys.
```swift
UserDefaults.standard.set(true, forKey: "useCheatProxy")
UserDefaults.standard.set("https://test.proxy.pvemu.invalid", forKey: "cheatProxyURL")
defer {
    UserDefaults.standard.set(false, forKey: "useCheatProxy")
    UserDefaults.standard.set("", forKey: "cheatProxyURL")
}
```
Same test isolation issue here: the defer block leaves useCheatProxy explicitly set to false rather than restoring the default (true) or removing the key. This can make subsequent tests unexpectedly skip the proxy path.
```swift
UserDefaults.standard.set(false, forKey: "useCheatProxy")
UserDefaults.standard.set("https://test.proxy.pvemu.invalid", forKey: "cheatProxyURL")
defer {
    UserDefaults.standard.set(false, forKey: "useCheatProxy")
    UserDefaults.standard.set("", forKey: "cheatProxyURL")
}
```
Same cleanup problem in this test: the defer block keeps the useCheatProxy key set to false (and cheatProxyURL set to an empty string) instead of restoring/removing the keys. Removing the keys in defer will better match the declared defaults and avoid cross-test contamination.
✅ Fixes pushed — addressed all Copilot review comments:

worker.js:
GameHackingOrgLookupTests.swift:
Already correct (no changes needed):
```js
    return new Response(JSON.stringify({ error: "Missing required parameter: title" }), {
        status: 400,
        headers: { "Content-Type": "application/json", ...corsHeaders(origin, env) },
    });
}

if (title.trim().length > MAX_TITLE_LENGTH) {
    return new Response(JSON.stringify({ error: "Parameter too long: title" }), {
        status: 400,
        headers: { "Content-Type": "application/json", ...corsHeaders(origin, env) },
    });
}

if (system.length > MAX_SYSTEM_LENGTH) {
    return new Response(JSON.stringify({ error: "Parameter too long: system" }), {
        status: 400,
        headers: { "Content-Type": "application/json", ...corsHeaders(origin, env) },
    });
```
The README/PR description states the proxy “never returns HTTP errors for failed lookups”, but this branch returns a JSON error object with HTTP 400 when title is missing/blank. If the intent is for clients to always decode an array, consider returning [] with 200 (and maybe a diagnostic header) instead; otherwise update the docs to distinguish invalid requests from lookup failures.
Suggested change:

```js
    return jsonResponse([], { "X-Validation-Error": "Missing required parameter: title" }, origin, env);
}

if (title.trim().length > MAX_TITLE_LENGTH) {
    return jsonResponse([], { "X-Validation-Error": "Parameter too long: title" }, origin, env);
}

if (system.length > MAX_SYSTEM_LENGTH) {
    return jsonResponse([], { "X-Validation-Error": "Parameter too long: system" }, origin, env);
```
> Returns an empty array `[]` when no cheats are found. HTTP 4xx/5xx are not returned for failed lookups — the caller should fall back to direct scraping on an empty result.
The statement that “HTTP 4xx/5xx are not returned” doesn’t match the current worker implementation, which returns 400 for missing title and 404 for unknown paths. Suggest rewording this to something like “upstream lookup failures return 200 + []” (while invalid requests may still return 4xx), or adjust the worker to fully honor the no-HTTP-errors contract.
Suggested change:

> Returns an empty array `[]` when no cheats are found. Upstream lookup failures always return HTTP 200 with `[]`; only invalid requests (for example, missing `title` or unknown paths) may return HTTP 4xx. The caller should fall back to direct scraping on an empty result.
```swift
UserDefaults.standard.set(true, forKey: "useCheatProxy")
UserDefaults.standard.set("https://test.proxy.pvemu.invalid", forKey: "cheatProxyURL")
defer {
    UserDefaults.standard.removeObject(forKey: "useCheatProxy")
    UserDefaults.standard.removeObject(forKey: "cheatProxyURL")
}
```
These tests set/remove raw UserDefaults values using string keys. Since the production code reads via Defaults[.useCheatProxy] / Defaults[.cheatProxyURL], it would be more robust to set/reset via the Defaults API (and avoid hard-coded key names), so the tests keep working if the key names/suite ever change and don’t leak state between test runs.
```js
function makeCacheKey(title, system) {
    const t = title.toLowerCase().trim().slice(0, MAX_TITLE_LENGTH);
    const s = (system || "any").toLowerCase().trim().slice(0, MAX_SYSTEM_LENGTH);
    return `ghorg::${t}::${s}`;
}
```
makeCacheKey() truncates title/system by character count, but Cloudflare KV limits keys by byte length (512 bytes). With non-ASCII titles (e.g., emoji / CJK), the UTF-8 byte length can still exceed the KV limit and cause KV get/put to throw (disabling caching for those requests). Consider hashing (e.g., SHA-256) or explicitly truncating by UTF-8 byte length when building the KV key.
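Byte-aware truncation as suggested here can be done with `TextEncoder`/`TextDecoder`, both available in Workers. The helper below is a sketch; the function name is hypothetical.

```javascript
// Sketch: truncate a string to at most maxBytes of UTF-8 without emitting a
// split multi-byte character. Decoding a sliced byte array replaces a trailing
// partial sequence with U+FFFD, which we then trim off.
function truncateUtf8(str, maxBytes) {
  const bytes = new TextEncoder().encode(str);
  if (bytes.length <= maxBytes) return str;
  const decoded = new TextDecoder("utf-8").decode(bytes.slice(0, maxBytes));
  return decoded.replace(/\uFFFD+$/u, ""); // drop the partial trailing character
}
```

Applying this to `t` and `s` (instead of `slice`, which counts UTF-16 code units) keeps the final key under KV's 512-byte limit even for CJK or emoji titles.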
```js
if (url.pathname !== "/cheats") {
    return new Response(JSON.stringify({ error: "Not found" }), {
        status: 404,
        headers: { "Content-Type": "application/json", ...corsHeaders(origin, env) },
    });
}
```
The proxy is described as returning an empty array on failed lookups (so clients can always decode an array), but this handler returns a JSON error object with HTTP 404 for non-/cheats paths. Either update the docs/PR description to clarify that invalid paths return HTTP errors, or change this to return [] with a 200 when you want a strict “never error” contract.
✅ Fixes pushed — addressed all 5 comments from Copilot review #4010138164:
Force-pushed repeatedly: 92ea529 → 97870fc → ae75fa9 → b0ccfef → e569462 → 9d59b9d → 8675c05 → 441cc7b → 905a8d2 → afa9918 → dd2d935 → a14c740 → fdf7f27.
- Add Scripts/cheat-proxy/worker.js — Cloudflare Workers script that scrapes GameHacking.org and returns normalised JSON cheat entries
- Add Scripts/cheat-proxy/wrangler.toml — worker deployment config
- Add Scripts/cheat-proxy/README.md — deployment instructions
- Add PVSettings keys: useCheatProxy (Bool, default true) and cheatProxyURL (String, default empty)
- Update GameHackingOrgLookup to call proxy first when configured, falling back to direct scraping on empty result or network error

Part of #3072

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- worker.js: implement ALLOWED_ORIGINS env var for CORS allow-listing, handle OPTIONS preflight, only cache to KV when the upstream fetch succeeded (avoids 24h false-negative caching on transient errors), bound KV key input lengths to 200/64 chars, wrap KV get/put in try/catch
- PVSettingsModel: fix doc comment for cheatProxyURL to accurately describe fallback behavior (empty string uses the default URL rather than disabling the proxy)
- GameHackingOrgLookupTests: add URLProtocol stub tests covering the proxy happy path, empty-results fallback, and the proxy-disabled code path
- changelog: remove trailing periods per repo style guide

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

- worker.js: enforce MAX_TITLE_LENGTH/MAX_SYSTEM_LENGTH at the request boundary (return 400) so oversized inputs are never forwarded upstream or cause KV key collisions via truncation
- GameHackingOrgLookupTests: use removeObject(forKey:) in defer blocks instead of setting useCheatProxy = false, so the default (true) is restored and tests don't leak state into each other

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

… isolation

- worker.js: return [] + 200 for all error cases (invalid path, missing title, too-long params) instead of 4xx, honoring the "never error" contract; surface issues via an X-Validation-Error diagnostic header
- worker.js: fix makeCacheKey to truncate by UTF-8 byte length (not char count) via TextEncoder/TextDecoder to avoid the KV 512-byte key limit with non-ASCII titles (CJK, emoji)
- README.md: update HTTP status docs to reflect the always-200 contract
- tests: replace raw UserDefaults string keys with the typed Defaults[.useCheatProxy] / Defaults.reset() API; add PVSettings to the test target dependencies

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

- worker.js: add an X-Proxy-Status: ok/error header to distinguish "upstream confirmed no cheats" from "transient fetch failure", allowing clients to skip the direct-scrape fallback on confirmed empty results
- GameHackingOrgLookup: fetchFromProxy returns [CheatDatabaseEntry]? (nil = proxy error → fallback, [] = proxy ok, no results → no fallback); uses the X-Proxy-Status header to gate the nil vs [] decision
- Tests: add DirectScrapeBlockerProtocol to prevent real gamehacking.org network calls; add cannedHeaders support to ProxyCannedProtocol; rename the proxyReturnsEmpty test to reflect the no-fallback semantics

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

…ertions

- worker.js: fetchHTML now throws on 5xx/network errors so transient upstream failures set X-Proxy-Status: error and are not KV-cached; 4xx still returns null (legitimate "not found")
- PVSettingsModel: correct the useCheatProxy doc comment — fallback only on proxy error/unreachable, not on an empty result with ok status
- GameHackingOrgLookupTests: assert the URLProtocol.registerClass return value so the test fails fast instead of silently hitting the real network
- cheat-proxy README: document X-Proxy-Status header semantics and correct the fallback behaviour description

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Summary
Implements the thin middleware proxy server for GameHacking.org cheat lookup described in #3072.
- `Scripts/cheat-proxy/worker.js` — Cloudflare Workers script that scrapes GameHacking.org and returns normalised JSON cheat entries (`[{ name, code, category }]`) with 24h KV cache
- `Scripts/cheat-proxy/wrangler.toml` — Cloudflare Workers deployment config (KV namespace binding, route config)
- `Scripts/cheat-proxy/README.md` — Step-by-step deployment guide (manual; requires a free Cloudflare account)
- `PVSettings` — Two new keys: `useCheatProxy` (Bool, default `true`) and `cheatProxyURL` (String, default `""`)
- `GameHackingOrgLookup.swift` — Now calls the proxy first when `useCheatProxy == true` and a non-empty `cheatProxyURL` is configured; falls back to direct HTML scraping on empty result or network error; existing scraper is unchanged

Proxy endpoint

Returns an empty array `[]` on no-results; never returns HTTP errors for failed lookups (caller falls back).

Deployment note

The Cloudflare Worker deployment requires a free Cloudflare account and the Wrangler CLI. The PR ships the code — a maintainer deploys it and sets `defaultProxyURL` in `GameHackingOrgLookup.swift` (or users can configure `cheatProxyURL` in settings). See `Scripts/cheat-proxy/README.md`.

Test plan

- `wrangler dev`; `curl "http://localhost:8787/cheats?title=Mario&system=n64"` returns a JSON array
- Second `curl "http://localhost:8787/cheats?title=Mario&system=n64"` call returns `X-Cache: HIT`
- Set `cheatProxyURL` in app settings → verify the proxy is called instead of direct scrape
- Set `useCheatProxy = false` → verify the direct scrape path is used

Part of #3072
🤖 Generated with Claude Code